
    Study of the X-ray activity of Sgr A* during the 2011 XMM-Newton campaign

    In Spring 2011 we observed Sgr A*, the supermassive black hole at the center of our Galaxy, with XMM-Newton, with a total exposure of ~226 ks, in coordination with 1.3 mm VLBI observations. We performed a timing analysis of the X-ray emission from Sgr A* using a Bayesian-blocks algorithm to detect the X-ray flares observed with XMM-Newton. Furthermore, we computed smoothed X-ray light curves for this campaign to determine the positions and amplitudes of the flares more accurately. We detected two X-ray flares, on 2011 March 30 and April 3, with peak detection levels of 6.8 and 5.9 sigma, respectively, in the XMM-Newton/EPIC light curve in the 2-10 keV energy range with 300 s bins. The former is characterized by two sub-flares: the first is very short (~458 s) with a peak luminosity of ~9.4E34 erg/s, whereas the second is longer (~1542 s) with a lower peak luminosity of ~6.8E34 erg/s. Comparison with the sample of X-ray flares detected during the 2012 Chandra XVP campaign favors the hypothesis that the 2011 March 30 event is a single flare rather than two distinct sub-flares. We model the light curve of this flare with gravitational lensing of a simple hotspot-like structure, but this model cannot satisfactorily reproduce the large decay of the light curve between the two sub-flares. From magnetic energy heating during the rise phase of the first sub-flare, and assuming an X-ray photon production efficiency of 1 and a magnetic field of 100 G at 2 r_g, we derive an upper limit of 100 r_g on the radial distance of the first sub-flare. From synchrotron cooling in the infrared during the decay phase of the first sub-flare, we estimate a lower limit of 4 r_g on the radial distance. The X-ray emitting region of the first sub-flare is thus located at a radial distance of 4-100 r_g and has a corresponding radius of 1.8-2.87 r_g for a magnetic field of 100 G at 2 r_g
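    The Bayesian-blocks segmentation used here to detect flares can be sketched in a few lines. The following is a minimal, self-contained sketch of the Scargle et al. (2013) dynamic-programming change-point finder for binned counts, applied to a synthetic light curve; the bin size, count rates, and injected flare are illustrative assumptions, not the XMM-Newton data:

```python
import numpy as np

def bayesian_blocks(t, x, p0=0.01):
    """Minimal Bayesian-blocks change-point finder for binned event counts
    (dynamic programming after Scargle et al. 2013). Returns block edges."""
    t = np.asarray(t, dtype=float)
    x = np.asarray(x, dtype=float)
    n = t.size
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    block_length = t[-1] - edges
    # Empirical prior on the number of change points (Scargle et al. 2013).
    ncp_prior = 4.0 - np.log(73.53 * p0 * n ** -0.478)
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for r in range(n):
        width = block_length[: r + 1] - block_length[r + 1]
        count = np.cumsum(x[r::-1])[::-1]          # counts in bins j..r
        fitness = count * (np.log(count) - np.log(width)) - ncp_prior
        fitness[1:] += best[:r]
        last[r] = int(np.argmax(fitness))
        best[r] = fitness[last[r]]
    # Walk back through the stored change points.
    cp = []
    i = n
    while i > 0:
        cp.append(i)
        i = last[i - 1]
    cp.append(0)
    return edges[np.array(cp[::-1])]

# Synthetic 2-10 keV light curve with 300 s bins: a flat quiescent level
# plus one injected flare (illustrative values, not the XMM-Newton data).
rng = np.random.default_rng(0)
t = np.arange(150.0, 30000.0, 300.0)               # bin centres, s
rate = np.full(t.size, 0.05)                       # quiescent rate, cts/s
rate[40:50] += 0.25                                # injected flare
counts = rng.poisson(rate * 300.0)

edges = bayesian_blocks(t, counts)
print(edges)  # block edges; the flare is bracketed by two change points
```

    The recovered block edges bracket any statistically significant rate change, which is how flare start and end times (and hence amplitudes) can be read off a light curve.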

    Side Cutting Biopsy Needle for Endoscopes

    Biopsy samples obtained through an endoscope are small because of instrument size limitations. To overcome this, we develop a side-cutting biopsy needle that fits through the working channel of the endoscope, similar to a stereotactic needle, and uses syringe suction to collect larger samples

    Simple Key Enumeration (and Rank Estimation) using Histograms: an Integrated Approach

    The main contribution of this paper is a new key enumeration algorithm that combines the conceptual simplicity of the rank estimation algorithm of Glowacz et al. (FSE 2015) with the parallelizability of the enumeration algorithms of Bogdanov et al. (SAC 2015) and Martin et al. (ASIACRYPT 2015). Our new algorithm is based on histograms. It allows obtaining simple bounds on the (small) rounding errors that it introduces and leads to straightforward parallelization. We further show that it can minimize the bandwidth of distributed key testing by selecting parameters that maximize the factorization of the lists of key candidates produced by the enumeration, which can be highly beneficial, e.g. if these tests are performed by a hardware coprocessor. We also put forward that the conceptual simplicity of our algorithm translates into efficient implementations (that slightly improve the state of the art). As an additional consolidating effort, we finally describe an open-source implementation of this new enumeration algorithm, combined with the FSE 2015 rank estimation one, that we make available with the paper
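    The histogram idea at the core of such algorithms can be illustrated with a small rank-estimation sketch (toy scores, not the paper's implementation): per-subkey log-likelihoods are binned into histograms on a common grid, the histograms are combined by convolution, and the rank of the correct key is bracketed by summing the combined histogram above the correct key's binned score, padded by the rounding error of one bin per subkey.

```python
import numpy as np

rng = np.random.default_rng(1)
n_subkeys, n_cand, n_bins = 4, 256, 512

# Toy per-subkey log-likelihood scores; in a real attack these come from
# a side-channel distinguisher such as a template attack. Candidate 0 of
# each subkey plays the role of the correct value here.
scores = rng.normal(size=(n_subkeys, n_cand))

# Bin every subkey's scores onto a common grid of n_bins histogram bins.
smin, smax = scores.min(), scores.max()
width = (smax - smin) / n_bins
idx = np.minimum(((scores - smin) / width).astype(int), n_bins - 1)
hists = [np.bincount(idx[k], minlength=n_bins) for k in range(n_subkeys)]

# Convolving the per-subkey histograms gives the distribution of the
# summed bin index over all n_cand**n_subkeys full keys.
conv = hists[0]
for h in hists[1:]:
    conv = np.convolve(conv, h)

# Bracket the rank of the full key: count keys whose summed bin index is
# at least the correct key's, padded by one bin of rounding per subkey.
b_correct = int(idx[np.arange(n_subkeys), 0].sum())
rank_upper = int(conv[max(b_correct - n_subkeys, 0):].sum())
rank_lower = int(conv[b_correct + n_subkeys:].sum())
print(rank_lower, rank_upper)
```

    More bins tighten the bracket at the cost of a longer convolution, which is the simple accuracy/efficiency trade-off that makes the histogram approach easy to tune and to parallelize.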

    Punctured Syndrome Decoding Problem: Efficient Side-Channel Attacks Against Classic McEliece

    Among the fourth-round finalists of the NIST post-quantum cryptography standardization process for public-key encryption algorithms and key encapsulation mechanisms, three rely on hard problems from coding theory. Key encapsulation mechanisms are frequently used in hybrid cryptographic systems: a public-key algorithm for key exchange and a secret-key algorithm for communication. The initial key exchange, performed by the key encapsulation mechanism, is therefore a critical step. In this paper, we analyze side-channel vulnerabilities of the key encapsulation mechanism implemented by the Classic McEliece cryptosystem, whose security is based on the syndrome decoding problem. We use side-channel leakage to reduce the complexity of the syndrome decoding problem by reducing the length of the code considered: puncturing columns from the original code yields a shorter, easier instance of the underlying hard problem. This approach leads to efficient profiled side-channel attacks that recover the session key with high success rates, even in noisy scenarios
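    The puncturing idea can be shown on a toy instance (all sizes are illustrative, unrelated to real Classic McEliece parameters): if a side channel reveals that some coordinates of the error vector are zero, the corresponding columns of the parity-check matrix contribute nothing to the syndrome and can be removed, leaving a smaller syndrome decoding instance with the same syndrome.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k, w = 24, 12, 3                        # toy code length, dimension, error weight
H = rng.integers(0, 2, size=(n - k, n), dtype=np.uint8)

# Secret low-weight error vector and its syndrome s = H e^T over GF(2).
e = np.zeros(n, dtype=np.uint8)
e[rng.choice(n, size=w, replace=False)] = 1
s = (H.astype(int) @ e) % 2

# Suppose a side channel leaks that these positions carry no error: the
# matching columns of H contribute nothing to s and can be punctured.
known_zero = np.flatnonzero(e == 0)[:10]
keep = np.setdiff1d(np.arange(n), known_zero)
H_punct = H[:, keep]                       # shorter instance, same syndrome
e_punct = e[keep]

assert np.array_equal((H_punct.astype(int) @ e_punct) % 2, s)
print(H.shape, "->", H_punct.shape)        # (12, 24) -> (12, 14)
```

    Because the cost of information-set decoding grows quickly with the code length, even a modest number of leaked error-free positions can make the residual instance practical to solve.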

    Horizontal Correlation Attack on Classic McEliece

    As the technical feasibility of a quantum computer becomes more and more likely, post-quantum cryptography algorithms are receiving particular attention in recent years. Among them, code-based cryptosystems were at first considered unsuited to hardware and embedded-software implementations because of their very large key sizes. However, recent work has shown that such implementations are practical, which also makes them susceptible to physical attacks. In this article, we propose a horizontal correlation attack on the Classic McEliece cryptosystem, more precisely on the matrix-vector multiplication over F_2 that computes the shared key in the encapsulation process. The attack is applicable in the broader context of Niederreiter-like code-based cryptosystems and is independent of the code structure, i.e. it does not need to exploit any particular structure in the parity-check matrix. Instead, we take advantage of the constant-time property of the matrix-vector multiplication over F_2. We extend the feasibility of the basic attack by leveraging information-set decoding methods and carry it out successfully on the reference embedded-software implementation. Interestingly, we highlight that implementation choices, such as the word size or the compilation options, play a crucial role in the attack's success, and can even contradict the theoretical analysis
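    A hedged, self-contained simulation of the horizontal setting (all sizes, the Hamming-weight leakage model, and the noise level are assumptions, not the paper's measurements): a constant-time matrix-vector product over GF(2) processes each column word by word, so a single trace yields many word operations. For each secret bit we correlate the per-word Hamming-weight model of its column with the matching trace segment; a high correlation indicates the column was really XORed in, i.e. the bit is 1.

```python
import numpy as np

rng = np.random.default_rng(3)
n_words, cols, sigma = 64, 64, 1.0

# Hamming weight of each 32-bit word of each column of the matrix.
hw = rng.binomial(32, 0.5, size=(n_words, cols)).astype(float)
b = rng.integers(0, 2, size=cols)          # secret bits selecting columns

# Single trace: one leakage sample per (word, column) operation; a word is
# only really XORed into the accumulator when the secret bit is 1.
trace = hw * b + rng.normal(0.0, sigma, size=(n_words, cols))

# Horizontal CPA: correlate each column's trace segment with its model.
guess = np.array([
    1 if np.corrcoef(trace[:, j], hw[:, j])[0, 1] > 0.5 else 0
    for j in range(cols)
])
print((guess == b).mean())                 # fraction of bits recovered
```

    Residual wrong bits can then be corrected with information-set decoding, which is the combination the attack relies on in noisier conditions.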

    Self-Timed Masking: Implementing Masked S-Boxes Without Registers

    Masking is one of the most widely used side-channel protection techniques. However, a secure masking scheme incurs additional implementation costs, e.g. in randomness and transistor count. Furthermore, glitches and early evaluation can temporarily weaken a masked implementation in hardware, creating a potential source of exploitable leakage. Registers are generally used to mitigate these threats, at the cost of increased area and latency. In this work, we show how to design glitch-free masking without registers, with the help of dual-rail encoding and asynchronous logic. This methodology is used to implement low-latency masking of arbitrary protection order. Finally, we present a side-channel evaluation of our first- and second-order masked AES implementations
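    The masking principle itself (independent of the paper's dual-rail asynchronous realization, which is a hardware technique) can be shown with a first-order Boolean-masked S-box lookup via table recomputation; the toy S-box and mask handling below are illustrative, not the paper's AES implementation:

```python
import random

random.seed(4)
SBOX = list(range(256))
random.shuffle(SBOX)                       # toy 8-bit S-box

def masked_sbox(x_masked, m_in, m_out):
    """Return SBOX[x] ^ m_out given only x ^ m_in: the table is recomputed
    so that the unmasked input and output never appear during the lookup."""
    table = [0] * 256
    for u in range(256):
        table[u ^ m_in] = SBOX[u] ^ m_out  # recomputed masked table
    return table[x_masked]

m_in, m_out = random.randrange(256), random.randrange(256)
x = 0x3A
y_masked = masked_sbox(x ^ m_in, m_in, m_out)
assert y_masked ^ m_out == SBOX[x]         # unmask only to verify
print(hex(y_masked))
```

    In hardware, the challenge the paper addresses is that glitches can transiently recombine the shares; the dual-rail asynchronous construction removes that hazard without the registers that conventional schemes insert between shared computation stages.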

    Natural History and Outcome of Hepatic Vascular Malformations in a Large Cohort of Patients with Hereditary Hemorrhagic Telangiectasia

    BACKGROUND: Hereditary hemorrhagic telangiectasia is a genetic disease characterized by telangiectasias involving virtually every organ. There are limited data in the literature regarding the natural history of liver vascular malformations in hereditary hemorrhagic telangiectasia and their associated morbidity and mortality. AIM: This prospective cohort study sought to assess the outcome of liver involvement in hereditary hemorrhagic telangiectasia patients. METHODS: We analyzed 16 years of surveillance data from a tertiary hereditary hemorrhagic telangiectasia referral center in Italy. We considered for inclusion 502 consecutive Italian patients at risk of hereditary hemorrhagic telangiectasia who presented at the referral center and underwent a multidisciplinary screening protocol for the diagnosis of the disease. Of the 502 individuals assessed, 154 had hepatic vascular malformations and were the subjects of the study; 198 patients with hereditary hemorrhagic telangiectasia but without hepatic vascular malformations served as controls. Additionally, we report the response to treatment of patients with complicated hepatic vascular malformations. RESULTS: The 154 patients were followed for a median period of 44 months (range 12-181); of these, eight (5.2%) died from complications related to the vascular malformations and 39 (25.3%) experienced complications. The average incidence rates of death and complications were 1.1 and 3.6 per 100 person-years, respectively. The median overall survival and event-free survival after diagnosis were 175 and 90 months, respectively. The rate of complete response to therapy was 63%. CONCLUSIONS: This study shows that substantial morbidity and mortality are associated with liver vascular malformations in hereditary hemorrhagic telangiectasia patients

    Strategies to Target Tumor Immunosuppression

    The tumor microenvironment is currently in the spotlight of cancer immunology research as a key factor impacting tumor development and progression. While antigen-specific immune responses play a crucial role in tumor rejection, the tumor hampers these immune responses by creating an immunosuppressive microenvironment. Recently, major progress has been achieved in the field of cancer immunotherapy, and several groundbreaking clinical trials demonstrated the potency of such therapeutic interventions in patients. Yet, the responses greatly vary among individuals. This calls for the rational design of more efficacious cancer immunotherapeutic interventions that take into consideration the “immune signature” of the tumor. Multimodality treatment regimens that aim to enhance intratumoral homing and activation of antigen-specific immune effector cells, while simultaneously targeting tumor immunosuppression, are pivotal for potent antitumor immunity

    Future mmVLBI Research with ALMA: A European vision

    Very long baseline interferometry at millimetre/submillimetre wavelengths (mmVLBI) offers the highest achievable spatial resolution at any wavelength in astronomy. The anticipated inclusion of ALMA as a phased array in a global VLBI network will bring unprecedented sensitivity and a transformational leap in capabilities for mmVLBI. Building on years of pioneering efforts in the US and Europe, the ongoing ALMA Phasing Project (APP), a US-led international collaboration with MPIfR-led European contributions, is expected to deliver a beamformer and VLBI capability to ALMA by the end of 2014 (APP: Fish et al. 2013, arXiv:1309.3519). This report focuses on the future use of mmVLBI by the international user community from a European viewpoint. Firstly, it highlights the intense scientific interest in Europe in future mmVLBI observations, as compiled from the responses to a general call to the European community for future research projects. A wide range of research is presented that includes, amongst others:
    - Imaging the event horizon of the black hole at the centre of the Galaxy
    - Testing the theory of General Relativity and/or searching for alternative theories
    - Studying the origin of AGN jets and jet formation
    - Cosmological evolution of galaxies and black holes, AGN feedback
    - Masers in the Milky Way (in stars and star-forming regions)
    - Extragalactic emission lines and astro-chemistry
    - Redshifted absorption lines in distant galaxies and study of the ISM and circumnuclear gas
    - Pulsars, neutron stars, X-ray binaries
    - Testing cosmology
    - Testing fundamental physical constants
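    The headline resolution claim follows from the diffraction limit theta ~ lambda / B. A quick back-of-the-envelope check with round, illustrative numbers (1.3 mm observing wavelength, roughly an Earth-diameter baseline) lands in the tens of microarcseconds, the scale needed to resolve the event horizon of the Galactic-centre black hole:

```python
import math

wavelength = 1.3e-3          # m, i.e. ~230 GHz
baseline = 1.27e7            # m, roughly Earth's diameter
theta_rad = wavelength / baseline
theta_uas = math.degrees(theta_rad) * 3600 * 1e6
print(f"{theta_uas:.0f} microarcseconds")
```

    No other astronomical technique reaches this regime, which is why phasing ALMA into the global mmVLBI array is considered transformational.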